Variable selection in high-dimensional linear model with possibly asymmetric errors
Authors
Abstract
In many application areas one encounters the problem of automatic variable selection in a linear model with asymmetric errors, when the number of explanatory variables diverges with the sample size. For this high-dimensional model, the penalized least squares method is not appropriate, and the quantile framework makes inference more difficult because of the non-differentiability of the loss function. An estimation method that penalizes the expectile process with an adaptive LASSO penalty is proposed and studied. Two cases are considered: first, the number of parameters is assumed to be much smaller than the sample size, and afterwards it may be of the same order; the two cases call for two distinct adaptive penalties. In each case, the rate of convergence is obtained and the oracle properties of the estimator are established. The estimators are evaluated through Monte Carlo simulations and compared with the corresponding quantile-based estimator. The method is also applied to real data from genetics.
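As a minimal sketch of the estimator described above, assuming standard notation rather than the paper's exact one: for a fixed expectile level $\tau \in (0,1)$, the adaptive LASSO expectile estimator minimizes

\[
Q_n(\beta) \;=\; \sum_{i=1}^{n} \rho_\tau\big(Y_i - X_i^\top \beta\big) \;+\; \lambda_n \sum_{j=1}^{p} \hat{\omega}_j \,|\beta_j|,
\qquad
\rho_\tau(u) \;=\; \big|\tau - \mathbf{1}_{\{u < 0\}}\big|\, u^2,
\]

where $\rho_\tau$ is the differentiable asymmetric squared (expectile) loss, $\lambda_n$ is a tuning sequence, and the weights $\hat{\omega}_j$ are data dependent, e.g. $\hat{\omega}_j = |\tilde{\beta}_j|^{-\gamma}$ for a preliminary estimator $\tilde{\beta}$ and some $\gamma > 0$. These choices of $\lambda_n$, $\hat{\omega}_j$ and $\gamma$ are illustrative and are not necessarily the two distinct penalties used in the paper's two regimes.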
Similar resources
High Dimensional Variable Selection.
This paper explores the following question: what kind of statistical guarantees can be given when doing variable selection in high dimensional models? In particular, we look at the error rates and power of some multi-stage regression methods. In the first stage we fit a set of candidate models. In the second stage we select one model by cross-validation. In the third stage we use hypothesis tes...
High-Dimensional Non-Linear Variable Selection through Hierarchical Kernel Learning
We consider the problem of high-dimensional non-linear variable selection for supervised learning. Our approach is based on performing linear selection among exponentially many appropriately defined positive definite kernels that characterize non-linear interactions between the original variables. To select efficiently from these many kernels, we use the natural hierarchical structure of the pr...
Variable Selection for Partially Linear Models with Measurement Errors.
This article focuses on variable selection for partially linear models when the covariates are measured with additive errors. We propose two classes of variable selection procedures, penalized least squares and penalized quantile regression, using the nonconvex penalized principle. The first procedure corrects the bias in the loss function caused by the measurement error by applying the so-call...
High Dimensional Variable Selection with Error Control
Background. The iterative sure independence screening (ISIS) is a popular method in selecting important variables while maintaining most of the informative variables relevant to the outcome in high throughput data. However, it not only is computationally intensive but also may cause high false discovery rate (FDR). We propose to use the FDR as a screening method to reduce the high dimension to ...
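As an illustrative sketch only (not the ISIS/FDR procedure of that paper), marginal screening with Benjamini–Hochberg FDR control could look like the following Python snippet; the function name, threshold and simulated data are assumptions.

import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def fdr_screen(X, y, alpha=0.05):
    """Keep covariates whose marginal correlation with y is significant
    after Benjamini-Hochberg FDR control (illustrative sketch only)."""
    n, p = X.shape
    pvals = np.array([stats.pearsonr(X[:, j], y)[1] for j in range(p)])
    keep, _, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return np.flatnonzero(keep)  # indices of retained variables

# Hypothetical usage on simulated high-dimensional data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 1000))
y = X[:, 0] - 2 * X[:, 1] + rng.standard_normal(100)
selected = fdr_screen(X, y)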
Ultrahigh Dimensional Variable Selection: beyond the linear model
Variable selection in high-dimensional space characterizes many contemporary problems in scientific discovery and decision making. Many frequently-used techniques are based on independence screening; examples include correlation ranking or feature selection using a twosample t-test in high-dimensional classification. Within the context of the linear model, Fan and Lv (2008) showed that this sim...
Journal
Journal title: Computational Statistics & Data Analysis
Year: 2021
ISSN: 0167-9473, 1872-7352
DOI: https://doi.org/10.1016/j.csda.2020.107112